Why we use dependency grammar

Author

  • Eva Hajicová
Abstract

1. First, I would like to say why I do care for formalisms. The point is not only that I was trained as a student of linguistics and that I always have been interested in theoretical linguistics, but in the present context also (and mainly) that natural language processing systems mostly are too complex to be built, modified, complemented, enriched, etc., without a solid theoretical background. As Prof. Nagao puts it, theory is important and valuable for the explanation and understanding; a language processing model should be understandable on the background of a powerful linguistic theory. On the other hand, I would like to stress that if linguistics wants to be useful and to make safe its own perspectives, then it has to be useful for linguistic engineering. This means for me that the theory has to be not only adequate, but also economical and modular. The task of the theory is to offer a relatively complete framework, which never captures all the details in their specific and often exceptional character, but which, as Karen Jensen notes in her point (7), offers a maximal coverage, i.e. which contains means necessary and sufficient for handling all such details as far as they are relevant for the given application field. In this respect, the theoretical framework can be compared to a fisherman's net, which need not be used whole if this is not necessary for the given pool; some of the meshes may be left unused in the bag or ashore, but in a larger pool they may be useful. The most important point is that the meshes are there, and we know where they are and for what purpose they might be useful.

2. The formalism is not the only important ingredient of an NLP system, and it is not interesting here for its own sake. It is true that the bottleneck of an NLP system is in handling the "dirty" exceptional cases, rather than the cases directly fitting into the main body of this or that theory. As a matter of fact, using any theory, we have to face such intricate but common examples as Kirschner's cases of target-language ambiguity (or vagueness) corresponding e.g. to that of English -ing forms, or the long but lexically bound sequences of nouns in terminological noun groups, or a procedure translating lexical items by modifying the productive affixes of international terms of Greek and Latin origin, and other "emergency rules" ensuring that at least an approximate (at least partially readable) output will be …
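The affix-modification procedure mentioned above can be pictured as a small suffix-rewriting fallback. The sketch below is purely illustrative: the affix pairs, the English-to-Czech direction, and the translate_international_term helper are assumptions made for the example, not Kirschner's actual rules.

```python
# Illustrative sketch of an "emergency rule" for international terms of
# Greek/Latin origin: translate a lexical item by swapping its productive
# affix for a target-language counterpart. The affix pairs and the
# English->Czech direction are assumptions for this example only.

AFFIX_RULES = [
    # (source suffix, target suffix) -- hypothetical pairs, ordered longest first
    ("ization", "izace"),
    ("ation", "ace"),
    ("ically", "icky"),
    ("ical", "ický"),
    ("ist", "ista"),
]


def translate_international_term(word):
    """Return a rough target-language form, or None if no rule applies."""
    for src, tgt in AFFIX_RULES:
        if word.endswith(src):
            return word[: -len(src)] + tgt
    return None  # fall through to other (dictionary or emergency) rules


# Example: produces an approximate, partially readable output
print(translate_international_term("organization"))  # -> "organizace"
```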


Similar articles

Why So Many Nodes?

This paper provides an analysis of the representation of various grammatical phenomena in both constituency structure and dependency structure (hereafter c-structure and d-structure), including agreement, case marking, and word order in transitive sentences, as well as three theoretical constructs, and the interface between the form of a sentence and its meaning. There is a crucial structural r...


Concavity and Initialization for Unsupervised Dependency Grammar Induction

We examine models for unsupervised learning with concave log-likelihood functions. We begin with the most well-known example, IBM Model 1 for word alignment (Brown et al., 1993), and study its properties, discussing why other models for unsupervised learning are so seldom concave. We then present concave models for dependency grammar induction and validate them experimentally. Despite their sim...


Enriching Lexical Transfer With Cross-Linguistic Semantic Features or How to Do Interlingua without Interlingua

In this paper, we propose an alternative to interlingua which can capture the analyses and generalizations that interlinguas can express, but which uses cross-linguistic semantic features rather than a separate level of representation. This alternative we call lexico-structural transfer. Lexico-structural transfer relies on the expressive power of a lexicalized syntactic representation (or “lex...


Concavity and Initialization for Unsupervised Dependency Parsing

We investigate models for unsupervised learning with concave log-likelihood functions. We begin with the most well-known example, IBM Model 1 for word alignment (Brown et al., 1993) and analyze its properties, discussing why other models for unsupervised learning are so seldom concave. We then present concave models for dependency grammar induction and validate them experimentally. We find our ...


Invited Talk: Slacker Semantics: Why Superficiality, Dependency and Avoidance of Commitment can be the Right Way to Go

This paper discusses computational compositional semantics from the perspective of grammar engineering, in the light of experience with the use of Minimal Recursion Semantics in DELPH-IN grammars. The relationship between argument indexation and semantic role labelling is explored and a semantic dependency notation (DMRS) is introduced.


The Shared Logistic Normal Distribution for Grammar Induction

We present a shared logistic normal distribution as a Bayesian prior over probabilistic grammar weights. This approach generalizes the similar use of logistic normal distributions [3], enabling soft parameter tying during inference across different multinomials comprising the probabilistic grammar. We show that this model outperforms previous approaches on an unsupervised dependency grammar ind...




Publication date: 1988